From MAP to Marginals: Variational Inference in Bayesian Submodular Models
Authors
Abstract
Submodular optimization has found many applications in machine learning and beyond. We carry out the first systematic investigation of inference in probabilistic models defined through submodular functions, generalizing regular pairwise MRFs and Determinantal Point Processes. In particular, we present L-FIELD, a variational approach to general log-submodular and log-supermodular distributions based on sub- and supergradients. We obtain both lower and upper bounds on the log-partition function, which enables us to compute probability intervals for marginals, conditionals and marginal likelihoods. We also obtain fully factorized approximate posteriors, at the same computational cost as ordinary submodular optimization. Our framework results in convex problems for optimizing over differentials of submodular functions, which we show how to optimally solve. We provide theoretical guarantees of the approximation quality with respect to the curvature of the function. We further establish natural relations between our variational approach and the classical mean-field method. Lastly, we empirically demonstrate the accuracy of our inference scheme on several submodular models.
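The modular-bound idea behind the abstract can be sketched on a toy example: for a submodular F, the marginal gains along a chain form a modular function s with s(A) ≤ F(A) for every A, which upper-bounds the log-partition function of the log-supermodular model p(A) ∝ exp(−F(A)) and yields a fully factorized approximate posterior. The function F and the weights below are illustrative assumptions for this sketch, not the paper's actual models or code:

```python
import itertools
import math

# Toy log-supermodular model p(A) ∝ exp(-F(A)) over subsets A of a small
# ground set, with F submodular (concave of a modular sum) and F(∅) = 0.
w = {0: 1.0, 1: 2.0, 2: 0.5, 3: 1.5}  # hypothetical element weights
V = sorted(w)

def F(A):
    """Submodular set function: sqrt of a nonnegative modular sum."""
    return math.sqrt(sum(w[i] for i in A))

def chain_subgradient(order):
    """Modular lower bound s(A) = sum_{i in A} s[i] with s(A) <= F(A),
    built from marginal gains along a chain of prefix sets."""
    s, prefix = {}, []
    for i in order:
        s[i] = F(prefix + [i]) - F(prefix)
        prefix.append(i)
    return s

# Exact log-partition function by brute force (feasible for |V| = 4).
log_Z = math.log(sum(math.exp(-F(A))
                     for r in range(len(V) + 1)
                     for A in itertools.combinations(V, r)))

# Since s(A) <= F(A), we get Z <= prod_i (1 + exp(-s_i)): an upper
# bound on log Z that factorizes over the elements of V.
s = chain_subgradient(V)
upper = sum(math.log1p(math.exp(-s[i])) for i in V)

# The same modular bound induces a fully factorized approximate
# posterior: q(i in A) = exp(-s_i) / (1 + exp(-s_i)).
q = {i: 1.0 / (1.0 + math.exp(s[i])) for i in V}

print(f"exact log Z = {log_Z:.4f}, upper bound = {upper:.4f}")
```

A supergradient (a modular upper bound on F) gives the complementary lower bound on log Z in the same way, and the gap between the two bounds brackets the marginals.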
Similar resources
Scalable Variational Inference in Log-supermodular Models
We consider the problem of approximate Bayesian inference in log-supermodular models. These models encompass regular pairwise MRFs with binary variables, but allow capturing high-order interactions, which are intractable for existing approximate inference techniques such as belief propagation, mean field, and variants. We show that a recently proposed variational approach to inference in log-su...
Full text
Worst-case Optimal Submodular Extensions for Marginal Estimation
Submodular extensions of an energy function can be used to efficiently compute approximate marginals via variational inference. The accuracy of the marginals depends crucially on the quality of the submodular extension. To identify the best possible extension, we show an equivalence between the submodular extensions of the energy and the objective functions of linear programming (LP) relaxation...
Full text
Approximating Marginals Using Discrete Energy Minimization
We consider the problem of inference in a graphical model with binary variables. While in theory it is arguably preferable to compute marginal probabilities, in practice researchers often use MAP inference due to the availability of efficient discrete optimization algorithms. We bridge the gap between the two approaches by introducing the Discrete Marginals technique in which approximate margin...
Full text
Bethe Bounds and Approximating the Global Optimum
Inference in general Markov random fields (MRFs) is NP-hard, though identifying the maximum a posteriori (MAP) configuration of pairwise MRFs with submodular cost functions is efficiently solvable using graph cuts. Marginal inference, however, even for this restricted class, is in #P. We prove new formulations of derivatives of the Bethe free energy, provide bounds on the derivatives and bracke...
Full text
Proteins, Particles, and Pseudo-Max-Marginals: A Submodular Approach
Variants of max-product (MP) belief propagation effectively find modes of many complex graphical models, but are limited to discrete distributions. Diverse particle max-product (D-PMP) robustly approximates max-product updates in continuous MRFs using stochastically sampled particles, but previous work was specialized to tree-structured models. Motivated by the challenging problem of protein sid...
Full text